Tensor product network

A tensor product network, in artificial neural networks, is a network that exploits the properties of tensors to model associative concepts such as variable assignment. Orthonormal vectors are chosen to represent the ideas (such as variable names and target assignments), and the tensor product of these vectors constructs a network whose mathematical properties allow the user to easily extract the stored association from it.
;Ranked Tensors
A rank-2 tensor can store an arbitrary binary relation, as in the worked example below.
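As a minimal worked example (assuming the standard basis vectors \(e_1, e_2\) as the orthonormal codes, which the original text does not specify), the relation \(\{(1,2), (2,1)\}\) is stored as a sum of outer products:
\[
B = e_1 e_2^{\mathsf T} + e_2 e_1^{\mathsf T} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}
\]
Each entry \(b_{ij}\) is 1 exactly when the pair \((i, j)\) is in the relation.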
;Teaching Mode
The network learns which variables have fillers (symbols) when vectors representing a variable and a filler are presented to the two sides of the network.
Teaching is one-shot (in contrast to the iterative learning used by backpropagation and other settling schemes): nothing is annealed or repeatedly adjusted, and no stopping criterion applies.
;Method
Teaching is accomplished by adjusting the values of the binding unit memory.
If the i-th component of the filler vector is \(f_i\) and the j-th component of the variable vector is \(v_j\), then \(f_i v_j\) is added to \(b_{ij}\) (the (i, j)-th binding unit memory) for each i and j.
Equivalently, regard the binding units as a matrix \(B\), and the filler and variable as column vectors \(f\) and \(v\). Teaching then forms the outer product \(f v^{\mathsf T}\) and adds it to \(B\):
\[ B' = B + f v^{\mathsf T} \]
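A minimal sketch of the one-shot teaching step, assuming NumPy and standard-basis (hence orthonormal) codes for fillers and variables; the array names and the teach helper are illustrative, not part of the original description:
```python
# One-shot teaching sketch: each taught pair adds an outer product to the
# binding-unit memory B. Assumes NumPy; names are illustrative.
import numpy as np

def teach(B, filler, variable):
    """Return B' = B + f v^T for one filler/variable pair."""
    return B + np.outer(filler, variable)

# Orthonormal codes: rows of identity matrices (standard basis vectors).
fillers = np.eye(3)    # up to m = 3 fillers
variables = np.eye(2)  # up to n = 2 variables

B = np.zeros((3, 2))                       # m x n binding-unit memory
B = teach(B, fillers[1], variables[0])     # bind filler 1 to variable 0
B = teach(B, fillers[2], variables[1])     # bind filler 2 to variable 1
```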
;Retrieval Mode
For exact retrieval:
*Vectors used to represent variables and fillers must be mutually orthogonal
*Such a set of vectors is referred to as an orthonormal set
*As such, these vectors are linearly independent
If the binding matrix has m rows and n columns, it can represent at most m fillers and n variables.
;Method
*Retrieval is accomplished by computing dot products.
To retrieve the value/filler for a variable with code vector \(v\) from a rank-2 tensor with binding unit values \(b_{ij}\), compute \(f_i = \sum_j b_{ij} v_j\) for each i; the resulting vector \(f\) represents the filler.
To test whether a variable \(v\) has filler \(f\), compute \(D = \sum_{i,j} b_{ij} v_j f_i\), where \(D\) is Boolean (1 or 0).
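A minimal sketch of retrieval by dot products, assuming NumPy and the same standard-basis codes as the teaching sketch above; the function names are illustrative:
```python
# Retrieval sketch: recover a filler with B @ v, and test a binding with f^T B v.
# Exactness relies on the codes being orthonormal. Assumes NumPy; names are illustrative.
import numpy as np

def retrieve_filler(B, variable):
    """f_i = sum_j b_ij v_j : recover the filler bound to a variable."""
    return B @ variable

def has_binding(B, variable, filler):
    """D = sum_ij b_ij v_j f_i : 1 if the pair was taught, 0 otherwise."""
    return float(filler @ B @ variable)

fillers, variables = np.eye(3), np.eye(2)
B = np.outer(fillers[1], variables[0]) + np.outer(fillers[2], variables[1])

print(retrieve_filler(B, variables[0]))          # [0. 1. 0.] -> fillers[1]
print(has_binding(B, variables[1], fillers[2]))  # 1.0
print(has_binding(B, variables[1], fillers[0]))  # 0.0
```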
==See also==

* Neural network
* Neuroscience


Source: Wikipedia, the free encyclopedia. Read the full "Tensor product network" article on Wikipedia.